Face Generation

In this project, you'll use generative adversarial networks (GANs) to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the CelebA dataset is complex and this is your first time using GANs, we want you to test your GAN on MNIST first, so you can evaluate how well your model performs more quickly.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".

In [2]:
data_dir = './data'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
#data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)
Found mnist Data
Found celeba Data

Explore the Data

MNIST

MNIST is a dataset of images of handwritten digits. You can change show_n_images to explore this dataset.

In [3]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')
Out[3]:
<matplotlib.image.AxesImage at 0x1172c5160>

CelebA

The CelebFaces Attributes Dataset (CelebA) contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can change show_n_images to explore this dataset.

In [4]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(mnist_images, 'RGB'))
Out[4]:
<matplotlib.image.AxesImage at 0x118101dd8>

Preprocess the Data

Since the focus of this project is on building GANs, we've preprocessed the data for you.

After preprocessing, the values of the 28×28-dimensional MNIST and CelebA images are in the range [-0.5, 0.5]. The CelebA images have been cropped to remove the parts of the image that don't include the face, then resized to 28×28.

The MNIST images are single-channel black-and-white images, while the CelebA images are three-channel RGB color images.
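The scaling step behind this preprocessing can be sketched in NumPy (a minimal sketch assuming raw uint8 pixels; the real preprocessing lives inside helper, and scale_images is a hypothetical name):

```python
import numpy as np

def scale_images(images):
    """Scale raw uint8 pixel values [0, 255] into the range [-0.5, 0.5]."""
    return images.astype(np.float32) / 255.0 - 0.5

# A hypothetical batch of 25 grayscale 28x28 images ('L' mode -> 1 channel)
batch = np.random.randint(0, 256, size=(25, 28, 28, 1), dtype=np.uint8)
scaled = scale_images(batch)
assert scaled.min() >= -0.5 and scaled.max() <= 0.5
```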

Build the Neural Network

You'll build the main components of a GAN by implementing the following functions:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

Check that you're using the correct version of TensorFlow and that a GPU is available.

In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.1.0
/Users/Yifan_Zhang/miniconda3/envs/dlnd-tf-lab/lib/python3.5/site-packages/ipykernel/__main__.py:14: UserWarning: No GPU found. Please use a GPU to train your neural network.

Input

Implement the model_inputs function to create TF placeholders for the neural network. Create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders as a tuple of (tensor of real input images, tensor of z data, learning rate).

In [6]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    # TODO: Implement Function
    inputs_real = tf.placeholder(tf.float32, (None, image_width, image_height, image_channels), name = 'input_real')
    inputs_z = tf.placeholder(tf.float32, (None, z_dim), name = 'input_z')
    learning_rate = tf.placeholder(tf.float32, name = 'learning_rate')
    return inputs_real, inputs_z, learning_rate


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)
Tests Passed

Discriminator

Implement the discriminator function to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused.

The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).

In [7]:
def discriminator(images, reuse=False):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # TODO: Implement Function
    # Set up constant parameters
    alpha = 0.2
    keep_prob = 0.5
    padding = 'SAME'
    strides = 2 # Use same strides in the next three layers
    
    # Apply convolution
    with tf.variable_scope('discriminator', reuse = reuse):
        # Conv layer 1
        cl_1 = tf.layers.conv2d(images, 64, 4, strides, padding, kernel_initializer=tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        # bn_1 = tf.layers.batch_normalization(cl_1, training=True)
        lrelu_1 = tf.maximum(alpha * cl_1, cl_1)
        dp_1 = tf.nn.dropout(lrelu_1, keep_prob)
        
        # Conv layer 2
        cl_2 = tf.layers.conv2d(dp_1, 128, 4, strides, padding, kernel_initializer = tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        bn_2 = tf.layers.batch_normalization(cl_2, training = True)
        lrelu_2 = tf.maximum(alpha * bn_2, bn_2)
        dp_2 = tf.nn.dropout(lrelu_2, keep_prob)
        
        # Conv layer 3: 7x7x128 -> 4x4x256
        cl_3 = tf.layers.conv2d(dp_2, 256, 4, strides, padding, kernel_initializer = tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        bn_3 = tf.layers.batch_normalization(cl_3, training = True)
        lrelu_3 = tf.maximum(alpha * bn_3, bn_3)
        dp_3 = tf.nn.dropout(lrelu_3, keep_prob)
        
        # Flatten the 4x4x256 feature maps
        flatten = tf.reshape(dp_3, (-1, 4*4*256))
        
        # Logits
        logits = tf.layers.dense(flatten, 1)
        
        # Output value
        output = tf.sigmoid(logits)
        
    return output, logits


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)
Tests Passed
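The flatten shape used above follows from how 'SAME' padding with stride 2 halves (rounding up) the spatial dimensions at each layer: 28 → 14 → 7 → 4. A quick sketch, assuming TensorFlow's output-size formula for 'SAME' padding:

```python
import math

def same_conv_output_size(size, stride):
    """Output spatial size of a conv layer with 'SAME' padding in TensorFlow."""
    return math.ceil(size / stride)

size = 28
sizes = [size]
for _ in range(3):  # three stride-2 conv layers
    size = same_conv_output_size(size, 2)
    sizes.append(size)
print(sizes)  # [28, 14, 7, 4] -> flatten the final 256-filter layer to 4*4*256
```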

Generator

Implement the generator function to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused.

The function should return the generated 28 × 28 × out_channel_dim images.

In [10]:
def generator(z, out_channel_dim, is_train=True):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    # TODO: Implement Function
    # Set up constant parameters
    reuse = not is_train
    alpha = 0.2
    keep_prob = 0.5
    strides_1 = 1
    strides_2 = 2
    padding_1 = 'SAME'
    padding_2 = 'VALID'
    
    # Apply convolution
    with tf.variable_scope('generator', reuse = reuse):
        fc_1 = tf.layers.dense(z, 4*4*512)
        
        # Reshape the dense output into a 4x4x512 stack of feature maps
        fc_1 = tf.reshape(fc_1, (-1, 4, 4, 512))
        # bn_1 = tf.layers.batch_normalization(fc_1, training = is_train)
        lrelu_1 = tf.maximum(alpha * fc_1, fc_1)
        dp_1 = tf.nn.dropout(lrelu_1, keep_prob)
        
        # Transposed conv layer 2: 4x4x512 -> 7x7x256 ('VALID' padding, stride 1)
        fc_2 = tf.layers.conv2d_transpose(dp_1, 256, 4, strides_1, padding_2, kernel_initializer=tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        bn_2 = tf.layers.batch_normalization(fc_2, training = is_train)
        lrelu_2 = tf.maximum(alpha * bn_2, bn_2)
        dp_2 = tf.nn.dropout(lrelu_2, keep_prob)
        
        # Transposed conv layer 3: 7x7x256 -> 14x14x128 ('SAME' padding, stride 2)
        fc_3 = tf.layers.conv2d_transpose(dp_2, 128, 4, strides_2, padding_1, kernel_initializer=tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        bn_3 = tf.layers.batch_normalization(fc_3, training = is_train)
        lrelu_3 = tf.maximum(alpha * bn_3, bn_3)
        dp_3 = tf.nn.dropout(lrelu_3, keep_prob)
        
        # Logits: 14x14x128 -> 28x28xout_channel_dim ('SAME' padding, stride 2)
        logits = tf.layers.conv2d_transpose(dp_3, out_channel_dim, 4, strides_2, padding_1, kernel_initializer=tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32))
        
        # Output
        output = tf.tanh(logits)
        
    return output


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)
Tests Passed
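Symmetrically, the 28 × 28 output size falls out of the transposed-convolution output-size rules, given the strides and paddings used above (a sketch, assuming TensorFlow's conv2d_transpose sizing: out = in * stride for 'SAME', out = (in - 1) * stride + kernel for 'VALID'):

```python
def deconv_output_size(size, kernel, stride, padding):
    """Output spatial size of tf.layers.conv2d_transpose."""
    if padding == 'SAME':
        return size * stride
    return (size - 1) * stride + kernel  # 'VALID'

size = 4                                        # reshaped dense output: 4x4x512
size = deconv_output_size(size, 4, 1, 'VALID')  # -> 7
size = deconv_output_size(size, 4, 2, 'SAME')   # -> 14
size = deconv_output_size(size, 4, 2, 'SAME')   # -> 28
print(size)  # 28
```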

Loss

Implement the model_loss function to compute the losses for training the GAN. The function should return a tuple of (discriminator loss, generator loss).

Use the functions you've already implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
In [11]:
def model_loss(input_real, input_z, out_channel_dim):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param out_channel_dim: The number of channels in the output image
    :return: A tuple of (discriminator loss, generator loss)
    """
    # TODO: Implement Function
    # One-sided label smoothing factor for the real labels
    smooth = 0.1
    
    # Build up GAN model with discriminator and generator function
    # Create generator
    GAN_g = generator(input_z, out_channel_dim, is_train = True)
    
    # Create discriminator
    GAN_d_real, d_logits_real = discriminator(input_real)
    GAN_d_fake, d_logits_fake = discriminator(GAN_g, reuse = True)
    
    # Build up loss calculation
    # Calculate the loss of real and fake individually
    d_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = d_logits_real,
                                labels = tf.ones_like(GAN_d_real) * (1 - smooth)))
    d_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = d_logits_fake, 
                                labels = tf.zeros_like(GAN_d_fake)))
    
    # Calculate the combined loss of generator and discriminator
    g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = d_logits_fake, labels = tf.ones_like(GAN_d_fake)))
    d_loss = d_loss_real + d_loss_fake
    
    return d_loss, g_loss


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)
Tests Passed
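The effect of the one-sided label smoothing above (real labels scaled by 1 - smooth) can be checked numerically. This small NumPy sketch reimplements the stable formula documented for tf.nn.sigmoid_cross_entropy_with_logits; it's illustrative only, not part of the notebook's graph:

```python
import numpy as np

def sigmoid_cross_entropy(logit, label):
    """Stable sigmoid cross-entropy, mirroring the formula used by
    tf.nn.sigmoid_cross_entropy_with_logits."""
    return max(logit, 0.0) - logit * label + np.log1p(np.exp(-abs(logit)))

# For a confident "real" prediction, smoothing (label 0.9 instead of 1.0)
# leaves a small residual loss, keeping the discriminator from saturating.
confident_logit = 5.0
loss_hard = sigmoid_cross_entropy(confident_logit, 1.0)    # ~0.0067
loss_smooth = sigmoid_cross_entropy(confident_logit, 0.9)  # ~0.5067
assert loss_smooth > loss_hard
```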

Optimization

Implement the model_opt function to create the optimization operations for the GAN. Use tf.trainable_variables to get all the trainable variables, and filter them by the variable scope names discriminator and generator. The function should return a tuple of (discriminator training operation, generator training operation).

In [12]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # TODO: Implement Function
    
    # Get all trainable variables
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]

    # Run each network's batch-norm update ops before its training step
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    d_updates = [op for op in update_ops if op.name.startswith('discriminator')]
    g_updates = [op for op in update_ops if op.name.startswith('generator')]

    with tf.control_dependencies(d_updates):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
    with tf.control_dependencies(g_updates):
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)

    return d_train_opt, g_train_opt

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)
Tests Passed

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well your GAN is training.

In [13]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()

Train

Implement the train function to build and train the GAN. Remember to use the following functions you've already implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use the show_generator_output function to show the generator's output as you train.

Note: running show_generator_output for every batch will drastically increase training time and the size of the notebook. It's recommended to print the generator's output every 100 batches.

In [14]:
def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    # TODO: Build Model
    # Get image parameters
    image_batches, image_width, image_height, image_channels = data_shape
    
    # Extract data from data_shape
    input_real, input_z, lr = model_inputs(image_width, image_height, image_channels, z_dim)
    
    # Apply loss function
    d_loss, g_loss = model_loss(input_real, input_z, image_channels)
    
    # Apply optimization function
    d_train_opt, g_train_opt = model_opt(d_loss, g_loss, lr, beta1)
    
    # Parameters initialization
    count = 0
    show_cycle = 100
    loss_cycle = 10
    images_num = 25
    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                # TODO: Train Model
                
                # Count processed batches
                count += 1
                
                # Rescale images from [-0.5, 0.5] to [-1, 1] to match the generator's tanh output
                batch_images = batch_images * 2
                
                # Sample random noise for the generator
                batch_z = np.random.uniform(-1, 1, size = (batch_size, z_dim))
                
                # Optimizing
                _ = sess.run(d_train_opt, feed_dict = {input_real: batch_images, input_z: batch_z, lr: learning_rate})
                _ = sess.run(g_train_opt, feed_dict = {input_real: batch_images, input_z: batch_z, lr: learning_rate})
                
                # Show the generator's current output
                if count % show_cycle == 0:
                    show_generator_output(sess, images_num, input_z, image_channels, data_image_mode)
                
                # Print the loss of discriminator and generator
                if count % loss_cycle == 0:
                    d_loss_training = d_loss.eval({input_z: batch_z, input_real: batch_images})
                    g_loss_training = g_loss.eval({input_z: batch_z, input_real: batch_images})
                    print("Epoch {}/{}...".format(epoch_i + 1, epoch_count),
                          "Discriminator Loss: {:.6f}...".format(d_loss_training),
                          "Generator Loss: {:.6f}".format(g_loss_training))

MNIST

Test your GAN on MNIST. After 2 epochs, the GAN should be able to generate images that look like handwritten digits. Make sure the loss of the generator is lower than the loss of the discriminator, or close to 0.

In [ ]:
batch_size = 64
z_dim = 64
learning_rate = 0.0002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)
Epoch 1/2... Discriminator Loss: 1.604914... Generator Loss: 1.279916
Epoch 1/2... Discriminator Loss: 1.299356... Generator Loss: 1.538565
Epoch 1/2... Discriminator Loss: 1.688670... Generator Loss: 1.117870
Epoch 1/2... Discriminator Loss: 1.609657... Generator Loss: 1.029962
Epoch 1/2... Discriminator Loss: 1.428542... Generator Loss: 1.257662
Epoch 1/2... Discriminator Loss: 1.653016... Generator Loss: 1.119262
Epoch 1/2... Discriminator Loss: 1.445308... Generator Loss: 1.298054
Epoch 1/2... Discriminator Loss: 1.580945... Generator Loss: 1.040785
Epoch 1/2... Discriminator Loss: 1.093537... Generator Loss: 1.588598
Epoch 1/2... Discriminator Loss: 1.605426... Generator Loss: 1.384242
Epoch 1/2... Discriminator Loss: 1.012639... Generator Loss: 1.916749
Epoch 1/2... Discriminator Loss: 1.714504... Generator Loss: 1.217544
Epoch 1/2... Discriminator Loss: 0.702727... Generator Loss: 2.242279
Epoch 1/2... Discriminator Loss: 1.136490... Generator Loss: 1.648259
Epoch 1/2... Discriminator Loss: 1.637527... Generator Loss: 1.300110
Epoch 1/2... Discriminator Loss: 1.091530... Generator Loss: 1.730206
Epoch 1/2... Discriminator Loss: 0.747029... Generator Loss: 2.314740
Epoch 1/2... Discriminator Loss: 1.235922... Generator Loss: 1.801608
Epoch 1/2... Discriminator Loss: 1.412113... Generator Loss: 1.486733
Epoch 1/2... Discriminator Loss: 0.825160... Generator Loss: 2.243394
Epoch 1/2... Discriminator Loss: 1.112850... Generator Loss: 1.838652
Epoch 1/2... Discriminator Loss: 1.449053... Generator Loss: 1.646398
Epoch 1/2... Discriminator Loss: 1.056025... Generator Loss: 1.752472
Epoch 1/2... Discriminator Loss: 0.786922... Generator Loss: 2.077204
Epoch 1/2... Discriminator Loss: 1.218820... Generator Loss: 1.644297
Epoch 1/2... Discriminator Loss: 1.236493... Generator Loss: 1.478200
Epoch 1/2... Discriminator Loss: 1.411699... Generator Loss: 1.515090
Epoch 1/2... Discriminator Loss: 1.523664... Generator Loss: 1.237330
Epoch 1/2... Discriminator Loss: 1.247432... Generator Loss: 1.438422
Epoch 1/2... Discriminator Loss: 1.154970... Generator Loss: 1.509094
Epoch 1/2... Discriminator Loss: 1.343203... Generator Loss: 1.220544
Epoch 1/2... Discriminator Loss: 1.058442... Generator Loss: 1.557774
Epoch 1/2... Discriminator Loss: 1.258018... Generator Loss: 1.316140
Epoch 1/2... Discriminator Loss: 1.061781... Generator Loss: 1.693125
Epoch 1/2... Discriminator Loss: 1.166595... Generator Loss: 1.360351
Epoch 1/2... Discriminator Loss: 1.317789... Generator Loss: 1.248269
Epoch 1/2... Discriminator Loss: 1.270132... Generator Loss: 1.301157
Epoch 1/2... Discriminator Loss: 1.147510... Generator Loss: 1.443870
Epoch 1/2... Discriminator Loss: 1.040720... Generator Loss: 1.494691
Epoch 1/2... Discriminator Loss: 1.446682... Generator Loss: 1.282449
Epoch 1/2... Discriminator Loss: 1.160189... Generator Loss: 1.416440
Epoch 1/2... Discriminator Loss: 1.265335... Generator Loss: 1.339147
Epoch 1/2... Discriminator Loss: 1.274018... Generator Loss: 1.512576
Epoch 1/2... Discriminator Loss: 1.080044... Generator Loss: 1.504069
Epoch 1/2... Discriminator Loss: 1.296424... Generator Loss: 1.199057
Epoch 1/2... Discriminator Loss: 1.057001... Generator Loss: 1.325861
Epoch 1/2... Discriminator Loss: 1.490105... Generator Loss: 1.206276
Epoch 1/2... Discriminator Loss: 1.136798... Generator Loss: 1.560716
Epoch 1/2... Discriminator Loss: 1.524322... Generator Loss: 0.901498
Epoch 1/2... Discriminator Loss: 1.363011... Generator Loss: 1.284842
Epoch 1/2... Discriminator Loss: 1.274179... Generator Loss: 1.102028
Epoch 1/2... Discriminator Loss: 1.291341... Generator Loss: 1.154704
Epoch 1/2... Discriminator Loss: 1.238306... Generator Loss: 1.301483
Epoch 1/2... Discriminator Loss: 1.196878... Generator Loss: 1.410108
Epoch 1/2... Discriminator Loss: 1.409432... Generator Loss: 0.960063
Epoch 1/2... Discriminator Loss: 1.286826... Generator Loss: 1.275395
Epoch 1/2... Discriminator Loss: 1.495419... Generator Loss: 1.077674
Epoch 1/2... Discriminator Loss: 1.025156... Generator Loss: 1.510839
Epoch 1/2... Discriminator Loss: 1.235408... Generator Loss: 1.261659
Epoch 1/2... Discriminator Loss: 1.257465... Generator Loss: 1.043735
Epoch 1/2... Discriminator Loss: 1.170085... Generator Loss: 1.295919
Epoch 1/2... Discriminator Loss: 1.150943... Generator Loss: 1.182698
Epoch 1/2... Discriminator Loss: 1.202075... Generator Loss: 1.225091
Epoch 1/2... Discriminator Loss: 1.231464... Generator Loss: 1.144966
Epoch 1/2... Discriminator Loss: 1.141988... Generator Loss: 1.298892
Epoch 1/2... Discriminator Loss: 1.215821... Generator Loss: 1.089350
Epoch 1/2... Discriminator Loss: 1.110618... Generator Loss: 1.316987
Epoch 1/2... Discriminator Loss: 1.357490... Generator Loss: 0.955178
Epoch 1/2... Discriminator Loss: 1.223305... Generator Loss: 1.310291
Epoch 1/2... Discriminator Loss: 1.328068... Generator Loss: 1.223616
Epoch 1/2... Discriminator Loss: 1.282289... Generator Loss: 1.038253
Epoch 1/2... Discriminator Loss: 1.288320... Generator Loss: 1.292413
Epoch 1/2... Discriminator Loss: 1.219030... Generator Loss: 1.214847
Epoch 1/2... Discriminator Loss: 1.182860... Generator Loss: 1.165306
Epoch 1/2... Discriminator Loss: 1.234440... Generator Loss: 1.061578
Epoch 1/2... Discriminator Loss: 1.159939... Generator Loss: 1.155460
Epoch 1/2... Discriminator Loss: 1.266081... Generator Loss: 1.200630
Epoch 1/2... Discriminator Loss: 1.272913... Generator Loss: 1.108363
Epoch 1/2... Discriminator Loss: 1.198495... Generator Loss: 1.227605
Epoch 1/2... Discriminator Loss: 1.181072... Generator Loss: 1.053581
Epoch 1/2... Discriminator Loss: 1.272859... Generator Loss: 1.160853
Epoch 1/2... Discriminator Loss: 1.169354... Generator Loss: 1.259187
Epoch 1/2... Discriminator Loss: 1.192433... Generator Loss: 1.172698
Epoch 1/2... Discriminator Loss: 1.264416... Generator Loss: 0.991731
Epoch 1/2... Discriminator Loss: 1.170921... Generator Loss: 1.113264
Epoch 1/2... Discriminator Loss: 1.163910... Generator Loss: 1.269730
Epoch 1/2... Discriminator Loss: 1.421452... Generator Loss: 0.904851
Epoch 1/2... Discriminator Loss: 1.337422... Generator Loss: 1.013645
Epoch 1/2... Discriminator Loss: 1.318055... Generator Loss: 1.062196
Epoch 1/2... Discriminator Loss: 1.384341... Generator Loss: 1.152845
Epoch 1/2... Discriminator Loss: 1.316102... Generator Loss: 0.978980
Epoch 1/2... Discriminator Loss: 1.105295... Generator Loss: 1.380493
Epoch 1/2... Discriminator Loss: 1.208681... Generator Loss: 1.148210
Epoch 2/2... Discriminator Loss: 1.346642... Generator Loss: 1.017039
Epoch 2/2... Discriminator Loss: 1.182484... Generator Loss: 1.201969
Epoch 2/2... Discriminator Loss: 1.171806... Generator Loss: 1.290731
Epoch 2/2... Discriminator Loss: 1.222891... Generator Loss: 1.125331
Epoch 2/2... Discriminator Loss: 1.344744... Generator Loss: 1.153251
Epoch 2/2... Discriminator Loss: 1.255025... Generator Loss: 1.271450
Epoch 2/2... Discriminator Loss: 1.257710... Generator Loss: 1.103608
Epoch 2/2... Discriminator Loss: 1.265081... Generator Loss: 0.951639
Epoch 2/2... Discriminator Loss: 1.186053... Generator Loss: 1.259645
Epoch 2/2... Discriminator Loss: 1.272644... Generator Loss: 1.314603
Epoch 2/2... Discriminator Loss: 1.237227... Generator Loss: 1.027675
Epoch 2/2... Discriminator Loss: 1.072619... Generator Loss: 1.157752
Epoch 2/2... Discriminator Loss: 1.161037... Generator Loss: 1.071616
Epoch 2/2... Discriminator Loss: 1.254390... Generator Loss: 1.093672
Epoch 2/2... Discriminator Loss: 1.257340... Generator Loss: 1.013971
Epoch 2/2... Discriminator Loss: 1.234052... Generator Loss: 1.011156
Epoch 2/2... Discriminator Loss: 1.191515... Generator Loss: 1.204467
Epoch 2/2... Discriminator Loss: 1.136162... Generator Loss: 1.289822
Epoch 2/2... Discriminator Loss: 1.190848... Generator Loss: 1.034317
Epoch 2/2... Discriminator Loss: 1.233872... Generator Loss: 1.038578
Epoch 2/2... Discriminator Loss: 1.197498... Generator Loss: 1.207440
Epoch 2/2... Discriminator Loss: 1.223646... Generator Loss: 1.072047
Epoch 2/2... Discriminator Loss: 1.229332... Generator Loss: 1.093174
Epoch 2/2... Discriminator Loss: 1.283610... Generator Loss: 1.195230
Epoch 2/2... Discriminator Loss: 1.296554... Generator Loss: 1.008433
Epoch 2/2... Discriminator Loss: 1.201413... Generator Loss: 1.189223
Epoch 2/2... Discriminator Loss: 1.292985... Generator Loss: 1.089901
Epoch 2/2... Discriminator Loss: 1.412044... Generator Loss: 1.009174
Epoch 2/2... Discriminator Loss: 1.308290... Generator Loss: 1.007923
Epoch 2/2... Discriminator Loss: 1.155983... Generator Loss: 0.961033
Epoch 2/2... Discriminator Loss: 1.171526... Generator Loss: 1.217743
Epoch 2/2... Discriminator Loss: 1.297567... Generator Loss: 0.873163
Epoch 2/2... Discriminator Loss: 1.444724... Generator Loss: 0.976303
Epoch 2/2... Discriminator Loss: 1.099082... Generator Loss: 1.184047
Epoch 2/2... Discriminator Loss: 1.291957... Generator Loss: 1.037948
Epoch 2/2... Discriminator Loss: 1.262205... Generator Loss: 0.951935
Epoch 2/2... Discriminator Loss: 1.313673... Generator Loss: 1.041462
Epoch 2/2... Discriminator Loss: 1.300065... Generator Loss: 1.112880
Epoch 2/2... Discriminator Loss: 1.318275... Generator Loss: 0.918531
Epoch 2/2... Discriminator Loss: 1.267502... Generator Loss: 1.067249
Epoch 2/2... Discriminator Loss: 1.331673... Generator Loss: 1.077766
Epoch 2/2... Discriminator Loss: 1.308669... Generator Loss: 1.036985
Epoch 2/2... Discriminator Loss: 1.097217... Generator Loss: 1.128410
Epoch 2/2... Discriminator Loss: 1.297734... Generator Loss: 1.165633
Epoch 2/2... Discriminator Loss: 1.210748... Generator Loss: 1.106629
Epoch 2/2... Discriminator Loss: 1.235231... Generator Loss: 1.066029
Epoch 2/2... Discriminator Loss: 1.405281... Generator Loss: 0.759412
Epoch 2/2... Discriminator Loss: 1.351671... Generator Loss: 0.971943
Epoch 2/2... Discriminator Loss: 1.347569... Generator Loss: 0.837978
Epoch 2/2... Discriminator Loss: 1.252576... Generator Loss: 1.004226
Epoch 2/2... Discriminator Loss: 1.238359... Generator Loss: 1.202645
Epoch 2/2... Discriminator Loss: 1.358992... Generator Loss: 0.918092
Epoch 2/2... Discriminator Loss: 1.310870... Generator Loss: 0.927614
Epoch 2/2... Discriminator Loss: 1.099512... Generator Loss: 0.962627
Epoch 2/2... Discriminator Loss: 1.064160... Generator Loss: 1.042042
Epoch 2/2... Discriminator Loss: 1.236705... Generator Loss: 0.960878
Epoch 2/2... Discriminator Loss: 1.373416... Generator Loss: 1.012704
Epoch 2/2... Discriminator Loss: 1.101848... Generator Loss: 0.979282
Epoch 2/2... Discriminator Loss: 1.232760... Generator Loss: 1.185347
Epoch 2/2... Discriminator Loss: 1.187883... Generator Loss: 1.144643
Epoch 2/2... Discriminator Loss: 1.218233... Generator Loss: 1.126355
Epoch 2/2... Discriminator Loss: 1.234507... Generator Loss: 1.193865
Epoch 2/2... Discriminator Loss: 1.233187... Generator Loss: 1.091302
Epoch 2/2... Discriminator Loss: 1.379818... Generator Loss: 1.013044
Epoch 2/2... Discriminator Loss: 1.270846... Generator Loss: 0.993854
Epoch 2/2... Discriminator Loss: 1.246320... Generator Loss: 1.083502
Epoch 2/2... Discriminator Loss: 1.344592... Generator Loss: 0.997963
Epoch 2/2... Discriminator Loss: 1.347362... Generator Loss: 0.915505
Epoch 2/2... Discriminator Loss: 1.366348... Generator Loss: 1.018091
Epoch 2/2... Discriminator Loss: 1.326696... Generator Loss: 0.906397
Epoch 2/2... Discriminator Loss: 1.394639... Generator Loss: 0.964684
Epoch 2/2... Discriminator Loss: 1.379666... Generator Loss: 0.960765
Epoch 2/2... Discriminator Loss: 1.154594... Generator Loss: 1.041233
Epoch 2/2... Discriminator Loss: 1.251478... Generator Loss: 1.095408
Epoch 2/2... Discriminator Loss: 1.312592... Generator Loss: 1.003271
Epoch 2/2... Discriminator Loss: 1.201886... Generator Loss: 1.064374
Epoch 2/2... Discriminator Loss: 1.260159... Generator Loss: 1.022630
Epoch 2/2... Discriminator Loss: 1.273733... Generator Loss: 1.135529
Epoch 2/2... Discriminator Loss: 1.254608... Generator Loss: 1.045188
Epoch 2/2... Discriminator Loss: 1.260392... Generator Loss: 0.997694
Epoch 2/2... Discriminator Loss: 1.275864... Generator Loss: 1.023085
Epoch 2/2... Discriminator Loss: 1.238252... Generator Loss: 1.058661
Epoch 2/2... Discriminator Loss: 1.155531... Generator Loss: 1.159663
Epoch 2/2... Discriminator Loss: 1.307201... Generator Loss: 1.055439
Epoch 2/2... Discriminator Loss: 1.282863... Generator Loss: 0.889102
Epoch 2/2... Discriminator Loss: 1.075623... Generator Loss: 1.073462
Epoch 2/2... Discriminator Loss: 1.205230... Generator Loss: 1.080411
Epoch 2/2... Discriminator Loss: 1.525225... Generator Loss: 1.000587
Epoch 2/2... Discriminator Loss: 1.282233... Generator Loss: 0.884783
Epoch 2/2... Discriminator Loss: 1.083598... Generator Loss: 1.153311
Epoch 2/2... Discriminator Loss: 1.282307... Generator Loss: 0.864147
Epoch 2/2... Discriminator Loss: 1.394463... Generator Loss: 0.882575
Epoch 2/2... Discriminator Loss: 1.339811... Generator Loss: 0.872985
Epoch 2/2... Discriminator Loss: 1.275935... Generator Loss: 1.097468

CelebA

Run your GAN on CelebA. On a typical GPU, each epoch takes around 20 minutes to run. You can run the whole epoch, or stop it once the GAN starts producing realistic face images.

In [ ]:
batch_size = 32
z_dim = 128
learning_rate = 0.0002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 1.026222... Generator Loss: 1.576823
Epoch 1/1... Discriminator Loss: 0.944939... Generator Loss: 2.002959
Epoch 1/1... Discriminator Loss: 0.983634... Generator Loss: 1.930697
Epoch 1/1... Discriminator Loss: 1.066821... Generator Loss: 1.738701
Epoch 1/1... Discriminator Loss: 0.911999... Generator Loss: 1.680609
Epoch 1/1... Discriminator Loss: 0.935077... Generator Loss: 2.179543
Epoch 1/1... Discriminator Loss: 0.914264... Generator Loss: 2.035017
Epoch 1/1... Discriminator Loss: 1.021143... Generator Loss: 2.149883
Epoch 1/1... Discriminator Loss: 0.853487... Generator Loss: 2.185961
Epoch 1/1... Discriminator Loss: 1.318330... Generator Loss: 1.548982
Epoch 1/1... Discriminator Loss: 0.879377... Generator Loss: 2.062196
Epoch 1/1... Discriminator Loss: 0.814423... Generator Loss: 2.360537
Epoch 1/1... Discriminator Loss: 0.728880... Generator Loss: 2.492301
Epoch 1/1... Discriminator Loss: 0.762704... Generator Loss: 2.704533
Epoch 1/1... Discriminator Loss: 0.601116... Generator Loss: 2.741834
Epoch 1/1... Discriminator Loss: 0.939723... Generator Loss: 2.290982
Epoch 1/1... Discriminator Loss: 0.677573... Generator Loss: 2.739143
Epoch 1/1... Discriminator Loss: 0.588772... Generator Loss: 3.244595
Epoch 1/1... Discriminator Loss: 0.759755... Generator Loss: 2.616587
Epoch 1/1... Discriminator Loss: 0.586797... Generator Loss: 3.385798
Epoch 1/1... Discriminator Loss: 0.542055... Generator Loss: 3.505713
Epoch 1/1... Discriminator Loss: 0.620938... Generator Loss: 3.338686
Epoch 1/1... Discriminator Loss: 0.594550... Generator Loss: 3.401645
Epoch 1/1... Discriminator Loss: 0.531798... Generator Loss: 3.392949
Epoch 1/1... Discriminator Loss: 0.565750... Generator Loss: 3.312060
Epoch 1/1... Discriminator Loss: 0.534344... Generator Loss: 3.266236
Epoch 1/1... Discriminator Loss: 0.587475... Generator Loss: 3.316948
Epoch 1/1... Discriminator Loss: 0.651375... Generator Loss: 2.754675
Epoch 1/1... Discriminator Loss: 0.822349... Generator Loss: 2.221838
Epoch 1/1... Discriminator Loss: 0.614405... Generator Loss: 3.690963
Epoch 1/1... Discriminator Loss: 0.511952... Generator Loss: 3.183440
Epoch 1/1... Discriminator Loss: 0.558194... Generator Loss: 4.331664
Epoch 1/1... Discriminator Loss: 0.503294... Generator Loss: 3.773515
Epoch 1/1... Discriminator Loss: 0.602412... Generator Loss: 2.892153
Epoch 1/1... Discriminator Loss: 0.637676... Generator Loss: 2.975928
Epoch 1/1... Discriminator Loss: 0.589570... Generator Loss: 3.027838
Epoch 1/1... Discriminator Loss: 0.537478... Generator Loss: 3.321502
Epoch 1/1... Discriminator Loss: 0.561278... Generator Loss: 2.936135
Epoch 1/1... Discriminator Loss: 0.581006... Generator Loss: 3.310052
Epoch 1/1... Discriminator Loss: 0.540020... Generator Loss: 3.305300
Epoch 1/1... Discriminator Loss: 0.501893... Generator Loss: 4.211509
Epoch 1/1... Discriminator Loss: 0.492986... Generator Loss: 3.711268
Epoch 1/1... Discriminator Loss: 0.521466... Generator Loss: 2.482068
Epoch 1/1... Discriminator Loss: 0.485873... Generator Loss: 4.205866
Epoch 1/1... Discriminator Loss: 0.564082... Generator Loss: 3.224827
Epoch 1/1... Discriminator Loss: 0.593205... Generator Loss: 3.551163
Epoch 1/1... Discriminator Loss: 0.749322... Generator Loss: 2.538280
Epoch 1/1... Discriminator Loss: 0.691892... Generator Loss: 2.633636
Epoch 1/1... Discriminator Loss: 0.591250... Generator Loss: 3.207402
Epoch 1/1... Discriminator Loss: 0.490381... Generator Loss: 3.262053
Epoch 1/1... Discriminator Loss: 0.608054... Generator Loss: 3.117069
Epoch 1/1... Discriminator Loss: 0.476027... Generator Loss: 3.067981
Epoch 1/1... Discriminator Loss: 0.679514... Generator Loss: 2.398774
Epoch 1/1... Discriminator Loss: 0.554106... Generator Loss: 3.172054
Epoch 1/1... Discriminator Loss: 0.614704... Generator Loss: 3.470090
Epoch 1/1... Discriminator Loss: 0.871653... Generator Loss: 2.594450
Epoch 1/1... Discriminator Loss: 0.673521... Generator Loss: 2.274489
Epoch 1/1... Discriminator Loss: 0.704326... Generator Loss: 2.293480
Epoch 1/1... Discriminator Loss: 0.808180... Generator Loss: 1.977411
Epoch 1/1... Discriminator Loss: 0.648245... Generator Loss: 2.567987
Epoch 1/1... Discriminator Loss: 0.860177... Generator Loss: 2.357497
Epoch 1/1... Discriminator Loss: 1.244841... Generator Loss: 1.628627
Epoch 1/1... Discriminator Loss: 1.199928... Generator Loss: 1.729249
Epoch 1/1... Discriminator Loss: 0.821830... Generator Loss: 2.251595
Epoch 1/1... Discriminator Loss: 1.035503... Generator Loss: 1.487041
Epoch 1/1... Discriminator Loss: 0.675031... Generator Loss: 2.381469
Epoch 1/1... Discriminator Loss: 1.077355... Generator Loss: 1.975561
Epoch 1/1... Discriminator Loss: 0.993738... Generator Loss: 1.923751
Epoch 1/1... Discriminator Loss: 0.853836... Generator Loss: 1.939278
Epoch 1/1... Discriminator Loss: 0.901276... Generator Loss: 2.302012
Epoch 1/1... Discriminator Loss: 1.404571... Generator Loss: 1.622888
Epoch 1/1... Discriminator Loss: 0.799725... Generator Loss: 2.030067
Epoch 1/1... Discriminator Loss: 0.806809... Generator Loss: 1.950109
Epoch 1/1... Discriminator Loss: 0.558529... Generator Loss: 2.647945
Epoch 1/1... Discriminator Loss: 0.569801... Generator Loss: 2.866792
Epoch 1/1... Discriminator Loss: 0.844765... Generator Loss: 2.483552
Epoch 1/1... Discriminator Loss: 0.791982... Generator Loss: 2.099220
Epoch 1/1... Discriminator Loss: 0.596390... Generator Loss: 2.362989
Epoch 1/1... Discriminator Loss: 0.918700... Generator Loss: 1.624602
Epoch 1/1... Discriminator Loss: 1.059910... Generator Loss: 1.447926
Epoch 1/1... Discriminator Loss: 0.745097... Generator Loss: 2.119694
Epoch 1/1... Discriminator Loss: 0.941731... Generator Loss: 1.847133
Epoch 1/1... Discriminator Loss: 0.857473... Generator Loss: 2.075270
Epoch 1/1... Discriminator Loss: 0.879653... Generator Loss: 2.187656
Epoch 1/1... Discriminator Loss: 0.620396... Generator Loss: 3.049852
Epoch 1/1... Discriminator Loss: 0.857568... Generator Loss: 2.038925
Epoch 1/1... Discriminator Loss: 0.666918... Generator Loss: 2.901414
Epoch 1/1... Discriminator Loss: 0.590990... Generator Loss: 2.638286
Epoch 1/1... Discriminator Loss: 0.630978... Generator Loss: 2.490016
Epoch 1/1... Discriminator Loss: 0.800674... Generator Loss: 2.141237
Epoch 1/1... Discriminator Loss: 0.789211... Generator Loss: 2.098143
Epoch 1/1... Discriminator Loss: 0.855618... Generator Loss: 1.864619
Epoch 1/1... Discriminator Loss: 0.985231... Generator Loss: 1.534156
Epoch 1/1... Discriminator Loss: 0.989105... Generator Loss: 2.110210
Epoch 1/1... Discriminator Loss: 0.880515... Generator Loss: 1.972726
Epoch 1/1... Discriminator Loss: 0.655534... Generator Loss: 2.483696
Epoch 1/1... Discriminator Loss: 0.670937... Generator Loss: 2.350441
Epoch 1/1... Discriminator Loss: 1.063375... Generator Loss: 1.334805
Epoch 1/1... Discriminator Loss: 0.885725... Generator Loss: 1.764034
Epoch 1/1... Discriminator Loss: 1.040852... Generator Loss: 1.310303
Epoch 1/1... Discriminator Loss: 0.950389... Generator Loss: 2.146494
Epoch 1/1... Discriminator Loss: 0.903066... Generator Loss: 1.969080
Epoch 1/1... Discriminator Loss: 0.783192... Generator Loss: 2.009815
Epoch 1/1... Discriminator Loss: 0.740456... Generator Loss: 2.260701
Epoch 1/1... Discriminator Loss: 1.280190... Generator Loss: 1.772741
Epoch 1/1... Discriminator Loss: 0.949106... Generator Loss: 2.037850
Epoch 1/1... Discriminator Loss: 0.957027... Generator Loss: 2.322520
Epoch 1/1... Discriminator Loss: 0.895331... Generator Loss: 1.590264
Epoch 1/1... Discriminator Loss: 1.344790... Generator Loss: 1.571701
Epoch 1/1... Discriminator Loss: 0.932994... Generator Loss: 1.904819
Epoch 1/1... Discriminator Loss: 1.572867... Generator Loss: 1.379416
Epoch 1/1... Discriminator Loss: 0.709349... Generator Loss: 1.720765
Epoch 1/1... Discriminator Loss: 0.892798... Generator Loss: 1.954763
Epoch 1/1... Discriminator Loss: 0.909022... Generator Loss: 1.652361
Epoch 1/1... Discriminator Loss: 0.898015... Generator Loss: 1.508886
Epoch 1/1... Discriminator Loss: 1.105249... Generator Loss: 1.917691
Epoch 1/1... Discriminator Loss: 1.213475... Generator Loss: 1.451662
Epoch 1/1... Discriminator Loss: 1.092339... Generator Loss: 1.704646
Epoch 1/1... Discriminator Loss: 0.987046... Generator Loss: 1.534092
Epoch 1/1... Discriminator Loss: 0.839736... Generator Loss: 2.051409
Epoch 1/1... Discriminator Loss: 0.790330... Generator Loss: 2.381116
Epoch 1/1... Discriminator Loss: 1.032876... Generator Loss: 1.719739
Epoch 1/1... Discriminator Loss: 0.918337... Generator Loss: 1.570817
Epoch 1/1... Discriminator Loss: 0.871964... Generator Loss: 1.400765
Epoch 1/1... Discriminator Loss: 1.223170... Generator Loss: 1.198321
Epoch 1/1... Discriminator Loss: 1.133065... Generator Loss: 1.767488
Epoch 1/1... Discriminator Loss: 1.131336... Generator Loss: 2.097357
Epoch 1/1... Discriminator Loss: 1.140982... Generator Loss: 1.441027
Epoch 1/1... Discriminator Loss: 0.683199... Generator Loss: 2.191739
Epoch 1/1... Discriminator Loss: 0.897638... Generator Loss: 1.759235
Epoch 1/1... Discriminator Loss: 0.780249... Generator Loss: 1.953306
Epoch 1/1... Discriminator Loss: 1.091882... Generator Loss: 1.865333
Epoch 1/1... Discriminator Loss: 1.011552... Generator Loss: 1.333611
Epoch 1/1... Discriminator Loss: 1.155035... Generator Loss: 1.394665
Epoch 1/1... Discriminator Loss: 1.135500... Generator Loss: 1.257528
Epoch 1/1... Discriminator Loss: 0.967493... Generator Loss: 1.678399
Epoch 1/1... Discriminator Loss: 0.990153... Generator Loss: 2.450830
Epoch 1/1... Discriminator Loss: 1.149639... Generator Loss: 1.282738
Epoch 1/1... Discriminator Loss: 1.088259... Generator Loss: 1.280260
Epoch 1/1... Discriminator Loss: 1.024440... Generator Loss: 1.585227
Epoch 1/1... Discriminator Loss: 1.363068... Generator Loss: 1.274153
Epoch 1/1... Discriminator Loss: 1.101205... Generator Loss: 1.292604
Epoch 1/1... Discriminator Loss: 1.209169... Generator Loss: 1.288806
Epoch 1/1... Discriminator Loss: 1.151459... Generator Loss: 1.445863
Epoch 1/1... Discriminator Loss: 0.987544... Generator Loss: 1.520535
Epoch 1/1... Discriminator Loss: 1.001664... Generator Loss: 1.263841
Epoch 1/1... Discriminator Loss: 1.228490... Generator Loss: 1.111937
Epoch 1/1... Discriminator Loss: 1.005030... Generator Loss: 2.043550
Epoch 1/1... Discriminator Loss: 1.131611... Generator Loss: 1.131333
Epoch 1/1... Discriminator Loss: 0.957048... Generator Loss: 1.144477
Epoch 1/1... Discriminator Loss: 1.017782... Generator Loss: 1.495223
Epoch 1/1... Discriminator Loss: 1.161963... Generator Loss: 1.293685
Epoch 1/1... Discriminator Loss: 1.115403... Generator Loss: 1.583195
Epoch 1/1... Discriminator Loss: 0.847681... Generator Loss: 1.358330
Epoch 1/1... Discriminator Loss: 1.164833... Generator Loss: 1.482294
Epoch 1/1... Discriminator Loss: 1.067548... Generator Loss: 1.277635
Epoch 1/1... Discriminator Loss: 0.978015... Generator Loss: 1.472753
Epoch 1/1... Discriminator Loss: 1.153736... Generator Loss: 1.580114
Epoch 1/1... Discriminator Loss: 1.396288... Generator Loss: 1.219600
Epoch 1/1... Discriminator Loss: 0.917564... Generator Loss: 1.581593
Epoch 1/1... Discriminator Loss: 1.369634... Generator Loss: 1.104423
Epoch 1/1... Discriminator Loss: 1.091333... Generator Loss: 1.387165
Epoch 1/1... Discriminator Loss: 1.166008... Generator Loss: 1.323575
Epoch 1/1... Discriminator Loss: 1.177436... Generator Loss: 1.430419
Epoch 1/1... Discriminator Loss: 1.194488... Generator Loss: 1.123032
Epoch 1/1... Discriminator Loss: 1.414672... Generator Loss: 1.155650
Epoch 1/1... Discriminator Loss: 1.257424... Generator Loss: 1.508981
Epoch 1/1... Discriminator Loss: 1.057808... Generator Loss: 1.329378
Epoch 1/1... Discriminator Loss: 1.046233... Generator Loss: 1.091412
Epoch 1/1... Discriminator Loss: 1.101277... Generator Loss: 1.240825
Epoch 1/1... Discriminator Loss: 1.434249... Generator Loss: 1.140487
Epoch 1/1... Discriminator Loss: 1.245934... Generator Loss: 1.210191
Epoch 1/1... Discriminator Loss: 1.337211... Generator Loss: 1.391875
Epoch 1/1... Discriminator Loss: 1.429495... Generator Loss: 1.092726
Epoch 1/1... Discriminator Loss: 1.133336... Generator Loss: 1.279099
Epoch 1/1... Discriminator Loss: 1.045753... Generator Loss: 1.709867
Epoch 1/1... Discriminator Loss: 1.439016... Generator Loss: 1.312521
Epoch 1/1... Discriminator Loss: 1.150885... Generator Loss: 1.217211
Epoch 1/1... Discriminator Loss: 1.083664... Generator Loss: 1.353116
Epoch 1/1... Discriminator Loss: 1.056338... Generator Loss: 1.350245
Epoch 1/1... Discriminator Loss: 1.303047... Generator Loss: 1.407858
Epoch 1/1... Discriminator Loss: 1.110517... Generator Loss: 1.251735
Epoch 1/1... Discriminator Loss: 1.267339... Generator Loss: 1.160060
Epoch 1/1... Discriminator Loss: 1.370310... Generator Loss: 1.279036
Epoch 1/1... Discriminator Loss: 1.071754... Generator Loss: 1.004870
Epoch 1/1... Discriminator Loss: 1.324221... Generator Loss: 1.118131
Epoch 1/1... Discriminator Loss: 1.087928... Generator Loss: 1.251659
Epoch 1/1... Discriminator Loss: 1.052799... Generator Loss: 1.443655
Epoch 1/1... Discriminator Loss: 1.223484... Generator Loss: 1.214445
Epoch 1/1... Discriminator Loss: 1.160449... Generator Loss: 1.331851
Epoch 1/1... Discriminator Loss: 1.128115... Generator Loss: 1.219409
Epoch 1/1... Discriminator Loss: 1.018740... Generator Loss: 1.443889
Epoch 1/1... Discriminator Loss: 1.111585... Generator Loss: 1.218983
Epoch 1/1... Discriminator Loss: 0.913025... Generator Loss: 1.379914
Epoch 1/1... Discriminator Loss: 1.234725... Generator Loss: 1.213493
Epoch 1/1... Discriminator Loss: 0.900389... Generator Loss: 1.689220
Epoch 1/1... Discriminator Loss: 1.143032... Generator Loss: 1.537791
Epoch 1/1... Discriminator Loss: 1.026092... Generator Loss: 1.426048
Epoch 1/1... Discriminator Loss: 0.966622... Generator Loss: 1.438283
Epoch 1/1... Discriminator Loss: 1.173527... Generator Loss: 1.012174
Epoch 1/1... Discriminator Loss: 1.222456... Generator Loss: 1.106194
Epoch 1/1... Discriminator Loss: 1.056130... Generator Loss: 1.216562
Epoch 1/1... Discriminator Loss: 1.290740... Generator Loss: 1.025403
Epoch 1/1... Discriminator Loss: 1.114008... Generator Loss: 1.228842
Epoch 1/1... Discriminator Loss: 1.288530... Generator Loss: 0.919064
Epoch 1/1... Discriminator Loss: 1.178439... Generator Loss: 1.024835
Epoch 1/1... Discriminator Loss: 1.079524... Generator Loss: 1.271102
Epoch 1/1... Discriminator Loss: 1.052847... Generator Loss: 1.651165
Epoch 1/1... Discriminator Loss: 1.173108... Generator Loss: 1.267261
Epoch 1/1... Discriminator Loss: 1.040313... Generator Loss: 1.218180
Epoch 1/1... Discriminator Loss: 1.196021... Generator Loss: 1.353311
Epoch 1/1... Discriminator Loss: 1.157822... Generator Loss: 1.007868
Epoch 1/1... Discriminator Loss: 0.927710... Generator Loss: 1.358269
Epoch 1/1... Discriminator Loss: 1.168858... Generator Loss: 1.349927
Epoch 1/1... Discriminator Loss: 1.063547... Generator Loss: 1.507398
Epoch 1/1... Discriminator Loss: 1.138759... Generator Loss: 1.212418
Epoch 1/1... Discriminator Loss: 0.979768... Generator Loss: 1.432612
Epoch 1/1... Discriminator Loss: 0.969221... Generator Loss: 1.419867
Epoch 1/1... Discriminator Loss: 0.964336... Generator Loss: 1.282375
Epoch 1/1... Discriminator Loss: 1.343719... Generator Loss: 1.031261
Epoch 1/1... Discriminator Loss: 1.159267... Generator Loss: 1.278905
Epoch 1/1... Discriminator Loss: 0.924167... Generator Loss: 1.567369
Epoch 1/1... Discriminator Loss: 1.324538... Generator Loss: 1.142109
Epoch 1/1... Discriminator Loss: 0.968440... Generator Loss: 1.472397
Epoch 1/1... Discriminator Loss: 1.126323... Generator Loss: 1.206172
Epoch 1/1... Discriminator Loss: 1.137621... Generator Loss: 1.305021
Epoch 1/1... Discriminator Loss: 1.180893... Generator Loss: 1.178107
Epoch 1/1... Discriminator Loss: 0.989460... Generator Loss: 1.429853
Epoch 1/1... Discriminator Loss: 0.971394... Generator Loss: 1.574598
Epoch 1/1... Discriminator Loss: 1.051028... Generator Loss: 1.237449
Epoch 1/1... Discriminator Loss: 1.041888... Generator Loss: 1.323275
Epoch 1/1... Discriminator Loss: 1.316472... Generator Loss: 1.067965
Epoch 1/1... Discriminator Loss: 1.351057... Generator Loss: 1.192091
Epoch 1/1... Discriminator Loss: 1.255995... Generator Loss: 1.127856
Epoch 1/1... Discriminator Loss: 1.281567... Generator Loss: 0.820329
Epoch 1/1... Discriminator Loss: 1.183977... Generator Loss: 1.190013
Epoch 1/1... Discriminator Loss: 1.197003... Generator Loss: 1.094012
Epoch 1/1... Discriminator Loss: 1.264789... Generator Loss: 1.206601
Epoch 1/1... Discriminator Loss: 1.137564... Generator Loss: 1.642171
Epoch 1/1... Discriminator Loss: 0.984014... Generator Loss: 1.302850
Epoch 1/1... Discriminator Loss: 1.319975... Generator Loss: 0.958513
Epoch 1/1... Discriminator Loss: 1.002399... Generator Loss: 1.106367
Epoch 1/1... Discriminator Loss: 1.088311... Generator Loss: 1.412805
Epoch 1/1... Discriminator Loss: 1.073499... Generator Loss: 1.696898
Epoch 1/1... Discriminator Loss: 1.236439... Generator Loss: 1.209104
Epoch 1/1... Discriminator Loss: 1.065851... Generator Loss: 1.017862
Epoch 1/1... Discriminator Loss: 1.242438... Generator Loss: 1.557669
Epoch 1/1... Discriminator Loss: 1.109009... Generator Loss: 1.051260
Epoch 1/1... Discriminator Loss: 0.994230... Generator Loss: 1.453035
Epoch 1/1... Discriminator Loss: 1.314937... Generator Loss: 1.238042
Epoch 1/1... Discriminator Loss: 1.094468... Generator Loss: 1.223160
Epoch 1/1... Discriminator Loss: 1.055255... Generator Loss: 1.136679
Epoch 1/1... Discriminator Loss: 1.240059... Generator Loss: 1.467418
Epoch 1/1... Discriminator Loss: 1.202783... Generator Loss: 1.332685
Epoch 1/1... Discriminator Loss: 1.058326... Generator Loss: 1.343371
Epoch 1/1... Discriminator Loss: 1.292189... Generator Loss: 1.203463
Epoch 1/1... Discriminator Loss: 1.249687... Generator Loss: 1.448294
Epoch 1/1... Discriminator Loss: 1.222533... Generator Loss: 1.072784
Epoch 1/1... Discriminator Loss: 1.213587... Generator Loss: 1.304550
Epoch 1/1... Discriminator Loss: 1.358210... Generator Loss: 1.126355
Epoch 1/1... Discriminator Loss: 0.944764... Generator Loss: 1.523393
Epoch 1/1... Discriminator Loss: 1.389266... Generator Loss: 0.921006
Epoch 1/1... Discriminator Loss: 1.119295... Generator Loss: 1.292246
Epoch 1/1... Discriminator Loss: 1.254365... Generator Loss: 1.098228
Epoch 1/1... Discriminator Loss: 1.115142... Generator Loss: 1.355329
Epoch 1/1... Discriminator Loss: 1.151460... Generator Loss: 1.575447
Epoch 1/1... Discriminator Loss: 0.884397... Generator Loss: 1.355134
Epoch 1/1... Discriminator Loss: 1.066632... Generator Loss: 1.181429
Epoch 1/1... Discriminator Loss: 1.055942... Generator Loss: 1.165948
Epoch 1/1... Discriminator Loss: 1.027706... Generator Loss: 1.307552
Epoch 1/1... Discriminator Loss: 1.159173... Generator Loss: 1.377064
Epoch 1/1... Discriminator Loss: 1.183246... Generator Loss: 1.597645
Epoch 1/1... Discriminator Loss: 1.269782... Generator Loss: 1.247395
Epoch 1/1... Discriminator Loss: 1.016693... Generator Loss: 1.159402
Epoch 1/1... Discriminator Loss: 1.130946... Generator Loss: 1.427134
Epoch 1/1... Discriminator Loss: 1.462515... Generator Loss: 0.879589
Epoch 1/1... Discriminator Loss: 1.033954... Generator Loss: 1.223881
Epoch 1/1... Discriminator Loss: 1.256188... Generator Loss: 1.192585
Epoch 1/1... Discriminator Loss: 0.973932... Generator Loss: 1.238712
Epoch 1/1... Discriminator Loss: 1.139099... Generator Loss: 1.367938
Epoch 1/1... Discriminator Loss: 1.043054... Generator Loss: 1.317312
Epoch 1/1... Discriminator Loss: 0.921892... Generator Loss: 1.326364
Epoch 1/1... Discriminator Loss: 1.232314... Generator Loss: 1.208735
Epoch 1/1... Discriminator Loss: 1.076911... Generator Loss: 1.294324
Epoch 1/1... Discriminator Loss: 1.203080... Generator Loss: 1.020883
Epoch 1/1... Discriminator Loss: 1.139019... Generator Loss: 1.368175
Epoch 1/1... Discriminator Loss: 1.243133... Generator Loss: 1.173075
Epoch 1/1... Discriminator Loss: 0.921742... Generator Loss: 1.475466
Epoch 1/1... Discriminator Loss: 0.985358... Generator Loss: 1.134770
Epoch 1/1... Discriminator Loss: 1.188933... Generator Loss: 1.132840
Epoch 1/1... Discriminator Loss: 1.436667... Generator Loss: 1.146549
Epoch 1/1... Discriminator Loss: 1.068043... Generator Loss: 1.337072
Epoch 1/1... Discriminator Loss: 1.276279... Generator Loss: 1.279693
Epoch 1/1... Discriminator Loss: 1.302401... Generator Loss: 1.437334
Epoch 1/1... Discriminator Loss: 1.126023... Generator Loss: 1.293885
Epoch 1/1... Discriminator Loss: 1.425530... Generator Loss: 0.974145
Epoch 1/1... Discriminator Loss: 1.166698... Generator Loss: 1.029372
Epoch 1/1... Discriminator Loss: 1.171303... Generator Loss: 1.237582
Epoch 1/1... Discriminator Loss: 1.149644... Generator Loss: 1.283165
Epoch 1/1... Discriminator Loss: 0.864899... Generator Loss: 1.274338
Epoch 1/1... Discriminator Loss: 0.971027... Generator Loss: 1.458804
Epoch 1/1... Discriminator Loss: 0.938928... Generator Loss: 1.263028
Epoch 1/1... Discriminator Loss: 1.219030... Generator Loss: 1.073441
Epoch 1/1... Discriminator Loss: 0.927881... Generator Loss: 1.338853
Epoch 1/1... Discriminator Loss: 1.231848... Generator Loss: 1.075807
Epoch 1/1... Discriminator Loss: 1.095681... Generator Loss: 1.410773
Epoch 1/1... Discriminator Loss: 1.031677... Generator Loss: 1.173307
Epoch 1/1... Discriminator Loss: 1.201856... Generator Loss: 0.933226
Epoch 1/1... Discriminator Loss: 1.238822... Generator Loss: 0.978179
Epoch 1/1... Discriminator Loss: 1.046638... Generator Loss: 1.172244
Epoch 1/1... Discriminator Loss: 1.161379... Generator Loss: 1.723806
Epoch 1/1... Discriminator Loss: 1.109792... Generator Loss: 1.017403
Epoch 1/1... Discriminator Loss: 1.174394... Generator Loss: 1.290154
Epoch 1/1... Discriminator Loss: 1.006832... Generator Loss: 1.201365
Epoch 1/1... Discriminator Loss: 0.906785... Generator Loss: 1.423695
Epoch 1/1... Discriminator Loss: 1.062707... Generator Loss: 1.197857
Epoch 1/1... Discriminator Loss: 1.191501... Generator Loss: 1.233417
Epoch 1/1... Discriminator Loss: 1.279520... Generator Loss: 1.467689
Epoch 1/1... Discriminator Loss: 1.104095... Generator Loss: 0.854726
Epoch 1/1... Discriminator Loss: 1.095392... Generator Loss: 1.210968
Epoch 1/1... Discriminator Loss: 1.208958... Generator Loss: 1.277792
Epoch 1/1... Discriminator Loss: 1.281206... Generator Loss: 1.227352
Epoch 1/1... Discriminator Loss: 1.195105... Generator Loss: 1.054322
Epoch 1/1... Discriminator Loss: 1.010622... Generator Loss: 1.356742
Epoch 1/1... Discriminator Loss: 1.137340... Generator Loss: 1.170314
Epoch 1/1... Discriminator Loss: 1.184492... Generator Loss: 1.310892
Epoch 1/1... Discriminator Loss: 1.269602... Generator Loss: 1.150148
Epoch 1/1... Discriminator Loss: 1.122833... Generator Loss: 1.119081
Epoch 1/1... Discriminator Loss: 1.353411... Generator Loss: 1.001481
Epoch 1/1... Discriminator Loss: 1.026371... Generator Loss: 1.385875
Epoch 1/1... Discriminator Loss: 1.212052... Generator Loss: 0.978284
Epoch 1/1... Discriminator Loss: 1.051089... Generator Loss: 1.333032
Epoch 1/1... Discriminator Loss: 1.311539... Generator Loss: 1.078519
Epoch 1/1... Discriminator Loss: 1.080958... Generator Loss: 1.264856
Epoch 1/1... Discriminator Loss: 1.268092... Generator Loss: 1.028698
Epoch 1/1... Discriminator Loss: 1.201673... Generator Loss: 1.131349
Epoch 1/1... Discriminator Loss: 1.233946... Generator Loss: 0.951428
Epoch 1/1... Discriminator Loss: 1.106897... Generator Loss: 1.317301
Epoch 1/1... Discriminator Loss: 1.418047... Generator Loss: 0.900902
Epoch 1/1... Discriminator Loss: 1.184053... Generator Loss: 1.366964
Epoch 1/1... Discriminator Loss: 1.257861... Generator Loss: 1.160880
Epoch 1/1... Discriminator Loss: 1.089725... Generator Loss: 1.125172
Epoch 1/1... Discriminator Loss: 1.118685... Generator Loss: 1.358909
Epoch 1/1... Discriminator Loss: 1.278973... Generator Loss: 1.017996
Epoch 1/1... Discriminator Loss: 1.213505... Generator Loss: 1.245998
Epoch 1/1... Discriminator Loss: 1.189970... Generator Loss: 1.270155
Epoch 1/1... Discriminator Loss: 1.243243... Generator Loss: 1.154035
Epoch 1/1... Discriminator Loss: 1.387085... Generator Loss: 0.867814
Epoch 1/1... Discriminator Loss: 1.057628... Generator Loss: 1.229868
Epoch 1/1... Discriminator Loss: 1.186767... Generator Loss: 1.269556
Epoch 1/1... Discriminator Loss: 1.333729... Generator Loss: 1.112696
Epoch 1/1... Discriminator Loss: 1.179558... Generator Loss: 1.352654
Epoch 1/1... Discriminator Loss: 1.404431... Generator Loss: 1.185816
Epoch 1/1... Discriminator Loss: 1.232621... Generator Loss: 1.214691
Epoch 1/1... Discriminator Loss: 1.362685... Generator Loss: 0.793535
Epoch 1/1... Discriminator Loss: 1.363056... Generator Loss: 0.964909
Epoch 1/1... Discriminator Loss: 1.124010... Generator Loss: 1.577408
Epoch 1/1... Discriminator Loss: 1.164279... Generator Loss: 1.143430
Epoch 1/1... Discriminator Loss: 1.059108... Generator Loss: 1.586433
Epoch 1/1... Discriminator Loss: 1.237262... Generator Loss: 1.112825
Epoch 1/1... Discriminator Loss: 1.015461... Generator Loss: 1.081359
Epoch 1/1... Discriminator Loss: 1.076535... Generator Loss: 1.416878
Epoch 1/1... Discriminator Loss: 1.167848... Generator Loss: 1.185667
Epoch 1/1... Discriminator Loss: 1.109411... Generator Loss: 1.150723
Epoch 1/1... Discriminator Loss: 1.300764... Generator Loss: 1.019651
Epoch 1/1... Discriminator Loss: 1.152715... Generator Loss: 1.442147
Epoch 1/1... Discriminator Loss: 1.545847... Generator Loss: 1.140196
Epoch 1/1... Discriminator Loss: 1.037029... Generator Loss: 1.329750
Epoch 1/1... Discriminator Loss: 1.057357... Generator Loss: 1.094433
Epoch 1/1... Discriminator Loss: 1.184160... Generator Loss: 1.147710
Epoch 1/1... Discriminator Loss: 1.177253... Generator Loss: 1.089736
Epoch 1/1... Discriminator Loss: 1.346648... Generator Loss: 1.024311
Epoch 1/1... Discriminator Loss: 1.118006... Generator Loss: 1.209850
Epoch 1/1... Discriminator Loss: 1.161625... Generator Loss: 0.972378
Epoch 1/1... Discriminator Loss: 1.155359... Generator Loss: 0.991527
Epoch 1/1... Discriminator Loss: 1.281017... Generator Loss: 1.040098
Epoch 1/1... Discriminator Loss: 1.160541... Generator Loss: 1.438912
Epoch 1/1... Discriminator Loss: 0.960458... Generator Loss: 1.396521
Epoch 1/1... Discriminator Loss: 1.189305... Generator Loss: 0.868153
Epoch 1/1... Discriminator Loss: 1.191973... Generator Loss: 1.080329
Epoch 1/1... Discriminator Loss: 1.041936... Generator Loss: 1.209654
Epoch 1/1... Discriminator Loss: 1.136719... Generator Loss: 1.544661
Epoch 1/1... Discriminator Loss: 1.152117... Generator Loss: 1.043717
Epoch 1/1... Discriminator Loss: 1.357711... Generator Loss: 0.930034
Epoch 1/1... Discriminator Loss: 1.020994... Generator Loss: 1.817240
Epoch 1/1... Discriminator Loss: 1.130542... Generator Loss: 0.982858
Epoch 1/1... Discriminator Loss: 1.121720... Generator Loss: 1.152058
Epoch 1/1... Discriminator Loss: 1.125095... Generator Loss: 1.124902
Epoch 1/1... Discriminator Loss: 1.204409... Generator Loss: 1.346624
Epoch 1/1... Discriminator Loss: 1.147348... Generator Loss: 1.181873
Epoch 1/1... Discriminator Loss: 1.102218... Generator Loss: 1.277575
Epoch 1/1... Discriminator Loss: 1.247526... Generator Loss: 0.870874
Epoch 1/1... Discriminator Loss: 1.112187... Generator Loss: 1.205240
Epoch 1/1... Discriminator Loss: 1.152304... Generator Loss: 1.301788
Epoch 1/1... Discriminator Loss: 1.286445... Generator Loss: 1.048010
Epoch 1/1... Discriminator Loss: 1.243311... Generator Loss: 1.344595
Epoch 1/1... Discriminator Loss: 1.264370... Generator Loss: 1.152738
Epoch 1/1... Discriminator Loss: 1.412311... Generator Loss: 0.887752
Epoch 1/1... Discriminator Loss: 1.354861... Generator Loss: 0.958100
Epoch 1/1... Discriminator Loss: 0.978676... Generator Loss: 1.474602
Epoch 1/1... Discriminator Loss: 1.204692... Generator Loss: 1.193900
Epoch 1/1... Discriminator Loss: 1.455293... Generator Loss: 0.898415
Epoch 1/1... Discriminator Loss: 1.268659... Generator Loss: 1.025829
Epoch 1/1... Discriminator Loss: 1.469894... Generator Loss: 0.849217
Epoch 1/1... Discriminator Loss: 1.094700... Generator Loss: 1.150928
Epoch 1/1... Discriminator Loss: 1.058518... Generator Loss: 1.297142
Epoch 1/1... Discriminator Loss: 1.021853... Generator Loss: 1.169372
Epoch 1/1... Discriminator Loss: 1.080041... Generator Loss: 1.085827
Epoch 1/1... Discriminator Loss: 1.302895... Generator Loss: 0.987330
Epoch 1/1... Discriminator Loss: 1.165751... Generator Loss: 1.504339
Epoch 1/1... Discriminator Loss: 1.197456... Generator Loss: 1.034858
Epoch 1/1... Discriminator Loss: 1.176862... Generator Loss: 0.936773
Epoch 1/1... Discriminator Loss: 1.093390... Generator Loss: 1.071664
Epoch 1/1... Discriminator Loss: 1.309652... Generator Loss: 1.177189
Epoch 1/1... Discriminator Loss: 1.040935... Generator Loss: 1.098436
Epoch 1/1... Discriminator Loss: 1.378464... Generator Loss: 1.292877
Epoch 1/1... Discriminator Loss: 1.094898... Generator Loss: 1.428869
Epoch 1/1... Discriminator Loss: 1.350664... Generator Loss: 1.093629
Epoch 1/1... Discriminator Loss: 1.052512... Generator Loss: 1.011799
Epoch 1/1... Discriminator Loss: 0.890743... Generator Loss: 1.389085
Epoch 1/1... Discriminator Loss: 1.221359... Generator Loss: 0.994098
Epoch 1/1... Discriminator Loss: 1.186363... Generator Loss: 0.995212
Epoch 1/1... Discriminator Loss: 1.166172... Generator Loss: 1.102533
Epoch 1/1... Discriminator Loss: 1.212054... Generator Loss: 1.250469
Epoch 1/1... Discriminator Loss: 1.194756... Generator Loss: 1.358191
Epoch 1/1... Discriminator Loss: 1.171522... Generator Loss: 1.222058
Epoch 1/1... Discriminator Loss: 1.249233... Generator Loss: 1.025265
Epoch 1/1... Discriminator Loss: 1.195175... Generator Loss: 1.172145
Epoch 1/1... Discriminator Loss: 1.020121... Generator Loss: 1.437032
Epoch 1/1... Discriminator Loss: 1.131778... Generator Loss: 1.178795
Epoch 1/1... Discriminator Loss: 1.256031... Generator Loss: 1.195221
Epoch 1/1... Discriminator Loss: 1.280077... Generator Loss: 1.241436
Epoch 1/1... Discriminator Loss: 1.117887... Generator Loss: 0.794583
Epoch 1/1... Discriminator Loss: 1.000890... Generator Loss: 1.405206
Epoch 1/1... Discriminator Loss: 1.176641... Generator Loss: 1.249949
Epoch 1/1... Discriminator Loss: 1.072717... Generator Loss: 1.075979
Epoch 1/1... Discriminator Loss: 1.080544... Generator Loss: 1.144668
Epoch 1/1... Discriminator Loss: 1.150269... Generator Loss: 0.792034
Epoch 1/1... Discriminator Loss: 1.259800... Generator Loss: 1.038086
Epoch 1/1... Discriminator Loss: 1.160468... Generator Loss: 1.294143
Epoch 1/1... Discriminator Loss: 1.444506... Generator Loss: 0.887875
Epoch 1/1... Discriminator Loss: 1.297336... Generator Loss: 1.059225
Epoch 1/1... Discriminator Loss: 1.165036... Generator Loss: 1.039947
Epoch 1/1... Discriminator Loss: 1.051927... Generator Loss: 1.038635
Epoch 1/1... Discriminator Loss: 1.182443... Generator Loss: 1.109186
Epoch 1/1... Discriminator Loss: 1.279046... Generator Loss: 1.535591
Epoch 1/1... Discriminator Loss: 1.043833... Generator Loss: 1.378091
Epoch 1/1... Discriminator Loss: 1.174453... Generator Loss: 0.819622
Epoch 1/1... Discriminator Loss: 1.330625... Generator Loss: 0.704206
Epoch 1/1... Discriminator Loss: 1.221244... Generator Loss: 1.129652
Epoch 1/1... Discriminator Loss: 1.215385... Generator Loss: 1.251042
Epoch 1/1... Discriminator Loss: 1.106246... Generator Loss: 1.278668
Epoch 1/1... Discriminator Loss: 1.039216... Generator Loss: 1.114542
Epoch 1/1... Discriminator Loss: 1.088412... Generator Loss: 1.496521
Epoch 1/1... Discriminator Loss: 0.934242... Generator Loss: 1.348969
Epoch 1/1... Discriminator Loss: 1.201761... Generator Loss: 1.359256
Epoch 1/1... Discriminator Loss: 1.062231... Generator Loss: 1.242308
Epoch 1/1... Discriminator Loss: 1.004766... Generator Loss: 1.335230
Epoch 1/1... Discriminator Loss: 0.874132... Generator Loss: 1.392646
Epoch 1/1... Discriminator Loss: 0.989960... Generator Loss: 1.514671
Epoch 1/1... Discriminator Loss: 1.092739... Generator Loss: 1.109451
Epoch 1/1... Discriminator Loss: 1.172322... Generator Loss: 1.446495
Epoch 1/1... Discriminator Loss: 1.027934... Generator Loss: 1.486726
Epoch 1/1... Discriminator Loss: 1.347695... Generator Loss: 1.108950
Epoch 1/1... Discriminator Loss: 0.983551... Generator Loss: 1.183910
Epoch 1/1... Discriminator Loss: 1.187081... Generator Loss: 1.075781
Epoch 1/1... Discriminator Loss: 1.105168... Generator Loss: 1.402093
Epoch 1/1... Discriminator Loss: 1.184514... Generator Loss: 0.950939
Epoch 1/1... Discriminator Loss: 1.194834... Generator Loss: 1.332840
Epoch 1/1... Discriminator Loss: 1.047495... Generator Loss: 1.444438
Epoch 1/1... Discriminator Loss: 1.158741... Generator Loss: 1.328894
Epoch 1/1... Discriminator Loss: 1.144546... Generator Loss: 1.292763
Epoch 1/1... Discriminator Loss: 1.043618... Generator Loss: 1.296545
Epoch 1/1... Discriminator Loss: 1.033741... Generator Loss: 0.995691
Epoch 1/1... Discriminator Loss: 1.038639... Generator Loss: 1.259869
Epoch 1/1... Discriminator Loss: 1.152513... Generator Loss: 1.444473
Epoch 1/1... Discriminator Loss: 1.263845... Generator Loss: 0.869258
Epoch 1/1... Discriminator Loss: 1.102450... Generator Loss: 1.399060
Epoch 1/1... Discriminator Loss: 0.978676... Generator Loss: 1.257202
Epoch 1/1... Discriminator Loss: 1.092658... Generator Loss: 1.122890
Epoch 1/1... Discriminator Loss: 1.168596... Generator Loss: 1.293823
Epoch 1/1... Discriminator Loss: 1.206906... Generator Loss: 1.255158
Epoch 1/1... Discriminator Loss: 1.336400... Generator Loss: 1.543052
Epoch 1/1... Discriminator Loss: 1.090508... Generator Loss: 0.895699
Epoch 1/1... Discriminator Loss: 1.254007... Generator Loss: 1.346165
Epoch 1/1... Discriminator Loss: 1.207347... Generator Loss: 1.257814
Epoch 1/1... Discriminator Loss: 1.159511... Generator Loss: 1.205286
Epoch 1/1... Discriminator Loss: 1.092109... Generator Loss: 1.189625
Epoch 1/1... Discriminator Loss: 1.178636... Generator Loss: 1.003640
Epoch 1/1... Discriminator Loss: 1.049219... Generator Loss: 1.306287
Epoch 1/1... Discriminator Loss: 1.426434... Generator Loss: 1.007116
Epoch 1/1... Discriminator Loss: 1.461631... Generator Loss: 0.911548
Epoch 1/1... Discriminator Loss: 1.290112... Generator Loss: 0.905529
Epoch 1/1... Discriminator Loss: 1.019809... Generator Loss: 1.097818
Epoch 1/1... Discriminator Loss: 1.311695... Generator Loss: 1.246418
Epoch 1/1... Discriminator Loss: 1.170227... Generator Loss: 1.260452
Epoch 1/1... Discriminator Loss: 0.893031... Generator Loss: 1.163640
Epoch 1/1... Discriminator Loss: 1.156090... Generator Loss: 1.372738
Epoch 1/1... Discriminator Loss: 1.300331... Generator Loss: 0.889560
Epoch 1/1... Discriminator Loss: 1.046623... Generator Loss: 1.046697
Epoch 1/1... Discriminator Loss: 1.062168... Generator Loss: 1.307810
Epoch 1/1... Discriminator Loss: 1.289748... Generator Loss: 0.864562
Epoch 1/1... Discriminator Loss: 1.055039... Generator Loss: 1.103714
Epoch 1/1... Discriminator Loss: 1.150895... Generator Loss: 1.053078
Epoch 1/1... Discriminator Loss: 1.143449... Generator Loss: 1.159471
Epoch 1/1... Discriminator Loss: 1.135683... Generator Loss: 1.370622
Epoch 1/1... Discriminator Loss: 1.223970... Generator Loss: 0.980428
Epoch 1/1... Discriminator Loss: 1.164294... Generator Loss: 1.134545
Epoch 1/1... Discriminator Loss: 0.956124... Generator Loss: 0.975754
Epoch 1/1... Discriminator Loss: 1.348623... Generator Loss: 1.053759
Epoch 1/1... Discriminator Loss: 1.143824... Generator Loss: 1.229771
Epoch 1/1... Discriminator Loss: 1.195541... Generator Loss: 1.110972
Epoch 1/1... Discriminator Loss: 1.557767... Generator Loss: 0.773004
Epoch 1/1... Discriminator Loss: 1.401490... Generator Loss: 1.113380
Epoch 1/1... Discriminator Loss: 1.316939... Generator Loss: 0.704524
Epoch 1/1... Discriminator Loss: 1.260827... Generator Loss: 0.932175
Epoch 1/1... Discriminator Loss: 1.124016... Generator Loss: 1.443036
Epoch 1/1... Discriminator Loss: 1.220385... Generator Loss: 1.210209
Epoch 1/1... Discriminator Loss: 1.034358... Generator Loss: 1.488097
Epoch 1/1... Discriminator Loss: 1.290892... Generator Loss: 0.888060
Epoch 1/1... Discriminator Loss: 1.153976... Generator Loss: 1.067564
Epoch 1/1... Discriminator Loss: 1.220700... Generator Loss: 0.989237
Epoch 1/1... Discriminator Loss: 1.227034... Generator Loss: 0.823804
Epoch 1/1... Discriminator Loss: 1.061765... Generator Loss: 1.236748
Epoch 1/1... Discriminator Loss: 1.089843... Generator Loss: 1.215622
Epoch 1/1... Discriminator Loss: 1.275367... Generator Loss: 0.973165
Epoch 1/1... Discriminator Loss: 1.204756... Generator Loss: 0.962675
Epoch 1/1... Discriminator Loss: 1.245228... Generator Loss: 1.027396
Epoch 1/1... Discriminator Loss: 1.211086... Generator Loss: 1.115794
Epoch 1/1... Discriminator Loss: 1.217030... Generator Loss: 1.064570
Epoch 1/1... Discriminator Loss: 1.150111... Generator Loss: 0.986720
Epoch 1/1... Discriminator Loss: 1.099165... Generator Loss: 1.104045
Epoch 1/1... Discriminator Loss: 1.186178... Generator Loss: 0.943437
Epoch 1/1... Discriminator Loss: 1.228167... Generator Loss: 1.178116
Epoch 1/1... Discriminator Loss: 1.228227... Generator Loss: 1.315965
Epoch 1/1... Discriminator Loss: 1.183024... Generator Loss: 1.064916
Epoch 1/1... Discriminator Loss: 1.234262... Generator Loss: 1.209488
Epoch 1/1... Discriminator Loss: 1.008483... Generator Loss: 0.921822
Epoch 1/1... Discriminator Loss: 1.212210... Generator Loss: 0.860305
Epoch 1/1... Discriminator Loss: 1.138498... Generator Loss: 1.169926
Epoch 1/1... Discriminator Loss: 1.073824... Generator Loss: 1.054874
Epoch 1/1... Discriminator Loss: 1.144868... Generator Loss: 1.312296
Epoch 1/1... Discriminator Loss: 0.997696... Generator Loss: 1.192711
Epoch 1/1... Discriminator Loss: 1.340962... Generator Loss: 1.115419
Epoch 1/1... Discriminator Loss: 1.086916... Generator Loss: 1.159152
Epoch 1/1... Discriminator Loss: 1.342747... Generator Loss: 0.946762
Epoch 1/1... Discriminator Loss: 1.116859... Generator Loss: 1.379138
Epoch 1/1... Discriminator Loss: 1.243571... Generator Loss: 0.931460
Epoch 1/1... Discriminator Loss: 1.317636... Generator Loss: 1.263751
Epoch 1/1... Discriminator Loss: 1.146463... Generator Loss: 1.135127
Epoch 1/1... Discriminator Loss: 1.137282... Generator Loss: 0.922163
Epoch 1/1... Discriminator Loss: 1.217662... Generator Loss: 1.141561
Epoch 1/1... Discriminator Loss: 0.981750... Generator Loss: 0.781745
Epoch 1/1... Discriminator Loss: 1.144605... Generator Loss: 1.250257
Epoch 1/1... Discriminator Loss: 0.976078... Generator Loss: 1.312971
Epoch 1/1... Discriminator Loss: 1.231296... Generator Loss: 0.921767
Epoch 1/1... Discriminator Loss: 1.248803... Generator Loss: 1.347277
Epoch 1/1... Discriminator Loss: 1.157168... Generator Loss: 0.754124
Epoch 1/1... Discriminator Loss: 1.100231... Generator Loss: 1.302968
Epoch 1/1... Discriminator Loss: 1.170490... Generator Loss: 1.114390
Epoch 1/1... Discriminator Loss: 1.267347... Generator Loss: 1.055395
Epoch 1/1... Discriminator Loss: 1.178465... Generator Loss: 0.889222
Epoch 1/1... Discriminator Loss: 1.349785... Generator Loss: 0.810484
Epoch 1/1... Discriminator Loss: 1.127863... Generator Loss: 1.090335
Epoch 1/1... Discriminator Loss: 1.255519... Generator Loss: 0.940400
Epoch 1/1... Discriminator Loss: 1.078072... Generator Loss: 1.265378
Epoch 1/1... Discriminator Loss: 1.120260... Generator Loss: 0.988840
Epoch 1/1... Discriminator Loss: 1.176741... Generator Loss: 1.177860
Epoch 1/1... Discriminator Loss: 1.167692... Generator Loss: 0.903898
Epoch 1/1... Discriminator Loss: 1.227632... Generator Loss: 1.327069
Epoch 1/1... Discriminator Loss: 1.114380... Generator Loss: 1.280118
Epoch 1/1... Discriminator Loss: 1.196607... Generator Loss: 1.412142
Epoch 1/1... Discriminator Loss: 1.356008... Generator Loss: 0.810451
Epoch 1/1... Discriminator Loss: 1.274616... Generator Loss: 0.795032
Epoch 1/1... Discriminator Loss: 1.350477... Generator Loss: 0.990616
Epoch 1/1... Discriminator Loss: 1.165465... Generator Loss: 1.119542
Epoch 1/1... Discriminator Loss: 1.308077... Generator Loss: 1.040785
Epoch 1/1... Discriminator Loss: 1.152861... Generator Loss: 1.258506
Epoch 1/1... Discriminator Loss: 1.197194... Generator Loss: 1.288229
Epoch 1/1... Discriminator Loss: 1.295470... Generator Loss: 0.948124
Epoch 1/1... Discriminator Loss: 1.270483... Generator Loss: 1.041108
Epoch 1/1... Discriminator Loss: 1.205813... Generator Loss: 1.009897
Epoch 1/1... Discriminator Loss: 1.236539... Generator Loss: 1.098998
Epoch 1/1... Discriminator Loss: 1.161214... Generator Loss: 1.159708
Epoch 1/1... Discriminator Loss: 1.170136... Generator Loss: 0.853078
Epoch 1/1... Discriminator Loss: 1.177227... Generator Loss: 0.982566
Epoch 1/1... Discriminator Loss: 1.103550... Generator Loss: 1.346964
Epoch 1/1... Discriminator Loss: 1.125099... Generator Loss: 1.100940
Epoch 1/1... Discriminator Loss: 1.291192... Generator Loss: 0.993405
Epoch 1/1... Discriminator Loss: 0.963799... Generator Loss: 1.062904
Epoch 1/1... Discriminator Loss: 1.089281... Generator Loss: 1.510758
Epoch 1/1... Discriminator Loss: 1.156348... Generator Loss: 1.366738
Epoch 1/1... Discriminator Loss: 1.099845... Generator Loss: 1.247075
Epoch 1/1... Discriminator Loss: 1.184807... Generator Loss: 1.144133
Epoch 1/1... Discriminator Loss: 1.377586... Generator Loss: 1.088324
Epoch 1/1... Discriminator Loss: 1.100749... Generator Loss: 1.111799
Epoch 1/1... Discriminator Loss: 1.121527... Generator Loss: 1.015423
Epoch 1/1... Discriminator Loss: 1.159356... Generator Loss: 1.070095
Epoch 1/1... Discriminator Loss: 1.340899... Generator Loss: 0.957907
Epoch 1/1... Discriminator Loss: 1.190873... Generator Loss: 0.843233
Epoch 1/1... Discriminator Loss: 1.151565... Generator Loss: 1.077441
Epoch 1/1... Discriminator Loss: 1.444668... Generator Loss: 0.950115
Epoch 1/1... Discriminator Loss: 1.230498... Generator Loss: 0.888788
Epoch 1/1... Discriminator Loss: 1.141370... Generator Loss: 1.084084
Epoch 1/1... Discriminator Loss: 1.222133... Generator Loss: 1.085658
Epoch 1/1... Discriminator Loss: 1.277816... Generator Loss: 0.911951
Epoch 1/1... Discriminator Loss: 1.253197... Generator Loss: 1.241328
Epoch 1/1... Discriminator Loss: 1.270073... Generator Loss: 0.822173
Epoch 1/1... Discriminator Loss: 1.277618... Generator Loss: 1.154669
Epoch 1/1... Discriminator Loss: 1.104868... Generator Loss: 1.005567
Epoch 1/1... Discriminator Loss: 1.284233... Generator Loss: 0.890819
Epoch 1/1... Discriminator Loss: 1.191154... Generator Loss: 0.878324
Epoch 1/1... Discriminator Loss: 0.977008... Generator Loss: 1.128697
Epoch 1/1... Discriminator Loss: 1.058872... Generator Loss: 1.396289
Epoch 1/1... Discriminator Loss: 1.198802... Generator Loss: 1.232069
Epoch 1/1... Discriminator Loss: 1.104381... Generator Loss: 1.433837
Epoch 1/1... Discriminator Loss: 1.213018... Generator Loss: 0.875506
Epoch 1/1... Discriminator Loss: 1.020222... Generator Loss: 1.036040
Epoch 1/1... Discriminator Loss: 1.203870... Generator Loss: 1.097233
Epoch 1/1... Discriminator Loss: 1.257864... Generator Loss: 1.089293
Epoch 1/1... Discriminator Loss: 1.096494... Generator Loss: 1.186421
Epoch 1/1... Discriminator Loss: 1.316858... Generator Loss: 0.811461
Epoch 1/1... Discriminator Loss: 1.401544... Generator Loss: 0.848721
Epoch 1/1... Discriminator Loss: 1.265719... Generator Loss: 1.058670
Epoch 1/1... Discriminator Loss: 1.258369... Generator Loss: 1.077444
Epoch 1/1... Discriminator Loss: 1.252216... Generator Loss: 0.954336
Epoch 1/1... Discriminator Loss: 1.186423... Generator Loss: 1.392003
Epoch 1/1... Discriminator Loss: 1.130361... Generator Loss: 1.279684
Epoch 1/1... Discriminator Loss: 1.080788... Generator Loss: 1.210436
Epoch 1/1... Discriminator Loss: 1.368702... Generator Loss: 0.655377
Epoch 1/1... Discriminator Loss: 1.236419... Generator Loss: 1.132313
Epoch 1/1... Discriminator Loss: 1.243768... Generator Loss: 0.977874
Epoch 1/1... Discriminator Loss: 1.106453... Generator Loss: 1.326264
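The raw log above is hard to read at a glance. As a minimal sketch (not part of the project code), the printed lines can be parsed into loss curves so the discriminator/generator balance can be inspected; `parse_losses` is a hypothetical helper written for this illustration.

```python
import re

# Matches lines like:
# "Epoch 1/1... Discriminator Loss: 1.106246... Generator Loss: 1.278668"
LOG_PATTERN = re.compile(
    r"Discriminator Loss: ([\d.]+)\.\.\. Generator Loss: ([\d.]+)")

def parse_losses(log_lines):
    """Return two lists: discriminator losses and generator losses."""
    d_losses, g_losses = [], []
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match:
            d_losses.append(float(match.group(1)))
            g_losses.append(float(match.group(2)))
    return d_losses, g_losses

sample = ["Epoch 1/1... Discriminator Loss: 1.106246... Generator Loss: 1.278668"]
d, g = parse_losses(sample)
print(d, g)  # [1.106246] [1.278668]
```

The two lists can then be passed to `pyplot.plot` (already imported earlier in this notebook) to visualize whether either network is overpowering the other, e.g. a discriminator loss collapsing toward 0 while the generator loss climbs.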

Submitting the Project

Before submitting this project, make sure you have run all cells and then saved the file.

Save the file as "dlnd_face_generation.ipynb", and also export it in HTML format via "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files with your submission.